Sparsity in Multiple Kernel Learning

Authors

  • Vladimir Koltchinskii
  • Ming Yuan
Abstract

The problem of multiple kernel learning based on penalized empirical risk minimization is discussed. The complexity penalty is determined jointly by the empirical L2 norms and the reproducing kernel Hilbert space (RKHS) norms induced by the kernels, with a data-driven choice of regularization parameters. The main focus is on the case where the total number of kernels is large but only a relatively small number of them is needed to represent the target function, so that the problem is sparse. The goal is to establish oracle inequalities for the excess risk of the resulting prediction rule, showing that the method is adaptive both to the unknown design distribution and to the sparsity of the problem.
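
To make the criterion concrete, here is a minimal numerical sketch of this kind of penalized empirical risk minimization, assuming squared loss and fixed regularization constants; the function name mkl_penalized_erm, the parameters lam1 and lam2, and the plain subgradient solver are illustrative choices made here, not the paper's data-driven procedure. Each component f_j is represented as K_j @ a_j, so its empirical L2 norm is sqrt(mean(f_j**2)) and its RKHS norm is sqrt(a_j' K_j a_j).

    import numpy as np

    def mkl_penalized_erm(Ks, y, lam1=0.1, lam2=0.1, lr=0.01, n_iter=2000):
        """Toy subgradient descent for
            (1/n) * ||y - sum_j K_j a_j||^2
              + sum_j [ lam1 * ||f_j||_n + lam2 * ||f_j||_{H_j} ],
        where f_j = K_j a_j, ||f_j||_n = sqrt(mean(f_j**2)) is the
        empirical L2 norm and ||f_j||_{H_j} = sqrt(a_j' K_j a_j) is
        the RKHS norm of the j-th component."""
        n, m = len(y), len(Ks)
        alphas = [np.zeros(n) for _ in range(m)]
        eps = 1e-12  # keeps the subgradient finite at the non-smooth origin
        for _ in range(n_iter):
            resid = y - sum(K @ a for K, a in zip(Ks, alphas))
            for j, K in enumerate(Ks):
                f_j = K @ alphas[j]
                emp = np.sqrt(np.mean(f_j ** 2)) + eps
                rkhs = np.sqrt(alphas[j] @ K @ alphas[j]) + eps
                grad = (-2.0 / n) * (K @ resid)        # squared-loss term
                grad += lam1 * (K @ f_j) / (n * emp)   # empirical-norm term
                grad += lam2 * (K @ alphas[j]) / rkhs  # RKHS-norm term
                alphas[j] -= lr * grad
        return alphas

Given several candidate kernel matrices Ks and responses y, components whose penalty outweighs their contribution to the fit are driven toward zero, which is the sparsity mechanism the abstract refers to; a serious implementation would replace the fixed lam1 and lam2 with the data-driven regularization weights analyzed in the paper and use a proximal method for the non-smooth terms.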


Related articles

Sparse and Non-sparse Multiple Kernel Learning for Recognition

Multiple kernel techniques have attracted particular interest from machine learning researchers working on computer vision topics such as image processing, object classification, and object state recognition. Sparsity-inducing norms along with non-sparse formulations promote different degrees of sparsity at the kernel coefficient level, while at the same time permitting non-sparse combination w...


Regularization Strategies and Empirical Bayesian Learning for MKL

Multiple kernel learning (MKL), structured sparsity, and multi-task learning have recently received considerable attention. In this paper, we show how different MKL algorithms can be understood as applications of either regularization on the kernel weights or block-norm-based regularization, which is more common in structured sparsity and multi-task learning. We show that these two regularization...
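
A standard variational identity (not taken from this paper; the kernel-weight notation \eta_m is introduced here for illustration) makes the correspondence between the two views concrete: l1 block-norm regularization is equivalent to jointly optimizing simplex-constrained kernel weights,

    \Bigl(\sum_{m=1}^{M}\|f_m\|_{\mathcal{H}_m}\Bigr)^{2}
      = \min_{\eta_m \ge 0,\ \sum_m \eta_m \le 1}
        \sum_{m=1}^{M}\frac{\|f_m\|_{\mathcal{H}_m}^{2}}{\eta_m},

with the minimum attained at \eta_m proportional to \|f_m\|_{\mathcal{H}_m}; this follows from the Cauchy–Schwarz inequality.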


Structured Sparsity and Generalization

We present a data-dependent generalization bound for a large class of regularized algorithms which implement structured sparsity constraints. The bound can be applied to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning, and other regularization schemes. In all these cases competitive results are obtained...


Variable Sparsity Kernel Learning

This paper presents novel algorithms and applications for a particular class of mixed-norm-regularization-based Multiple Kernel Learning (MKL) formulations. The formulations assume that the given kernels are grouped and employ l1-norm regularization for promoting sparsity within the RKHS norms of each group and lq, q ≥ 2, norm regularization for promoting non-sparse combinations across groups. Various...
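
In symbols (notation reconstructed here from the description above, not quoted from the paper): with the kernels partitioned into groups G_1, ..., G_g, a mixed-norm regularizer of this kind reads

    \Omega(f) = \Bigl(\sum_{j=1}^{g}\Bigl(\sum_{k\in G_j}\|f_k\|_{\mathcal{H}_k}\Bigr)^{q}\Bigr)^{1/q},
    \qquad q \ge 2,

so the inner l1 sum promotes sparsity within each group while the outer lq norm favors non-sparse combinations across groups.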


Fast Learning Rate of Multiple Kernel Learning: Trade-Off between Sparsity and Smoothness

We investigate the learning rate of multiple kernel learning (MKL) with l1 and elastic-net regularizations. The elastic-net regularization is a composition of an l1-regularizer for inducing sparsity and an l2-regularizer for controlling smoothness. We focus on a sparse setting where the total number of kernels is large but the number of non-zero components of the ground truth is relatively...
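
Spelled out in the generic form the description above suggests (our notation, with M kernels and tuning constants \lambda_1, \lambda_2 introduced for illustration), the elastic-net penalty is the composite

    \lambda_1 \sum_{m=1}^{M}\|f_m\|_{\mathcal{H}_m}
      + \lambda_2 \sum_{m=1}^{M}\|f_m\|_{\mathcal{H}_m}^{2},

where the first, l1-type term over the RKHS norms induces sparsity and the second, squared term controls smoothness.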




Publication date: 2010